Analysis of limited-memory BFGS on a class of nonsmooth convex functions

Authors
Abstract


Similar papers

BFGS convergence to nonsmooth minimizers of convex functions

Under reasonable conditions, the popular BFGS quasi-Newton minimization algorithm converges globally on smooth convex functions; this result was proved by Powell in 1976. We consider its implications for functions that are not smooth. In particular, an analogous convergence result holds for functions, such as the Euclidean norm, that are nonsmooth at the minimizer.
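As a quick, self-contained illustration of this behavior (my own sketch, not an experiment from the paper), one can run an off-the-shelf BFGS implementation on the Euclidean norm; the starting point and tolerances below are arbitrary choices.

```python
# Minimal sketch: run SciPy's BFGS on f(x) = ||x||_2, which is smooth except
# at its minimizer x = 0. Starting point and tolerances are illustrative
# choices, not values taken from the paper.
import numpy as np
from scipy.optimize import minimize

def f(x):
    return np.linalg.norm(x)

def grad(x):
    n = np.linalg.norm(x)
    # x/||x|| is undefined at 0; the iterates generically avoid the exact minimizer.
    return x / n if n > 0 else np.zeros_like(x)

x0 = np.array([1.0, -2.0, 0.5])
res = minimize(f, x0, jac=grad, method="BFGS",
               options={"gtol": 1e-12, "maxiter": 500})
# The gradient has unit norm away from 0, so the gradient test never triggers;
# BFGS stops once the line search stalls, with f(x_k) driven toward 0.
print("final f:", res.fun, "iterations:", res.nit)
```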


A class of diagonal preconditioners for limited memory BFGS method

A major weakness of the limited memory BFGS (LBFGS) method is that it may converge very slowly on ill-conditioned problems when the identity matrix is used for initialization. Very often, the LBFGS method can replace this identity initialization with a preconditioner to speed up convergence. For this purpose, we propose a class of diagonal preconditioners to boost the performance of the LBFGS method. In...
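As a rough sketch of the general idea of a diagonal initialization (the specific formula below is my own illustrative stand-in, not the preconditioner class proposed in the paper), one can build a per-coordinate diagonal from the most recent curvature pair, falling back on the standard scalar scaling where a coordinate is unreliable.

```python
# Illustrative stand-in (an assumed formula, not the paper's preconditioner class):
# build a diagonal initial inverse-Hessian approximation H0 = diag(d) from the
# most recent curvature pair (s, y) instead of using H0 = I.
import numpy as np

def diagonal_h0(s, y, eps=1e-10):
    """Per-coordinate secant-like diagonal, falling back on the usual scalar
    scaling gamma = s^T y / y^T y where a coordinate is unreliable."""
    gamma = np.dot(s, y) / max(np.dot(y, y), eps)         # standard L-BFGS scaling
    d = np.full_like(s, gamma)
    mask = np.abs(y) > eps
    d[mask] = np.clip(s[mask] / y[mask], eps, 1.0 / eps)  # keep entries positive, bounded
    return d

# Usage with a made-up curvature pair (s = x_{k+1} - x_k, y = g_{k+1} - g_k).
s = np.array([0.5, -0.1, 0.02])
y = np.array([2.0, -0.5, 0.4])
print("diagonal H0:", diagonal_h0(s, y))
```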


A Numerical Study of Limited Memory BFGS

The application of quasi-Newton methods is widespread in numerical optimization. Independently of the application, the techniques used to update the BFGS matrices seem to play an important role in the performance of the overall method. In this paper we address precisely this issue. We compare two implementations of the limited memory BFGS method for large-scale unconstrained problems. They differ...
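For context on what such implementations actually store and compute, here is a generic sketch of the textbook two-loop recursion over the last m curvature pairs; it is written from the standard description and is not either of the implementations compared in the paper.

```python
# Generic textbook L-BFGS two-loop recursion: compute the search direction
# d = -H_k g from the last m curvature pairs without ever forming H_k.
# A standalone sketch, not either of the implementations compared above.
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    q = grad.copy()
    alphas, rhos = [], []
    for s, y in zip(reversed(s_list), reversed(y_list)):     # newest pair first
        rho = 1.0 / np.dot(y, s)
        alpha = rho * np.dot(s, q)
        q -= alpha * y
        rhos.append(rho)
        alphas.append(alpha)
    # Initial matrix H0 = gamma * I, with the usual scaling from the newest pair.
    gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    r = gamma * q
    for (s, y), alpha, rho in zip(zip(s_list, y_list),       # oldest pair first
                                  reversed(alphas), reversed(rhos)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return -r

# Usage with two made-up curvature pairs and a gradient.
g = np.array([1.0, -2.0])
S = [np.array([0.1, 0.0]), np.array([0.05, -0.1])]
Y = [np.array([0.2, 0.1]), np.array([0.1, -0.3])]
print(lbfgs_direction(g, S, Y))
```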


On a Class of Nonsmooth Composite Functions

We discuss in this paper a class of nonsmooth functions which can be represented, in a neighborhood of a considered point, as a composition of a positively homogeneous convex function and a smooth mapping which maps the considered point into the null vector. We argue that this is a sufficiently rich class of functions and that such functions have various properties useful for purposes of optimization...
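A concrete instance of this composite structure (my own example, not one taken from the paper) is f(x) = h(c(x)) with h the Euclidean norm, which is convex and positively homogeneous, and c a smooth mapping that vanishes at the considered point.

```python
# A concrete instance (my own example, not one from the paper) of the composite
# structure f(x) = h(c(x)): h is the Euclidean norm (convex, positively
# homogeneous) and c is a smooth mapping with c(x_bar) = 0 at the considered point.
import numpy as np

def h(z):
    return np.linalg.norm(z)

def c(x):
    return np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])

x_bar = np.array([1.0, 1.0]) / np.sqrt(2.0)   # considered point: c(x_bar) = (0, 0)
print("c(x_bar) =", c(x_bar))

# Positive homogeneity check: h(t z) == t h(z) for t > 0.
z, t = np.array([3.0, -4.0]), 2.5
print(h(t * z), t * h(z))                     # both equal 12.5
```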


Global convergence of online limited memory BFGS

Global convergence is established for an online (stochastic) limited memory version of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method for solving optimization problems with stochastic objectives that arise in large-scale machine learning. Lower and upper bounds on the Hessian eigenvalues of the sample functions are shown to suffice to guarantee that the curvature approximation matrices...
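The eigenvalue condition mentioned here can be illustrated with a tiny, made-up quadratic mini-batch loss: when both gradients in a curvature pair are evaluated on the same sample, s^T y is sandwiched between the extreme Hessian eigenvalues of that sample times ||s||^2. This sketch only illustrates that bound; it is not the algorithm analyzed in the paper.

```python
# Tiny made-up example of the eigenvalue condition: with a quadratic mini-batch
# loss, forming both gradients of a curvature pair on the SAME sample gives
# y = H s, so s^T y lies between the extreme Hessian eigenvalues times ||s||^2.
import numpy as np

rng = np.random.default_rng(0)
A, b = rng.normal(size=(8, 3)), rng.normal(size=8)
grad = lambda w: A.T @ (A @ w - b) / len(b)    # gradient of one sampled mini-batch
H = A.T @ A / len(b)                           # its (constant) Hessian
m_lo, m_hi = np.linalg.eigvalsh(H)[[0, -1]]    # eigenvalue lower/upper bounds

w_old = np.zeros(3)
w_new = w_old + 0.1 * rng.normal(size=3)
s = w_new - w_old
y = grad(w_new) - grad(w_old)                  # same mini-batch for both gradients
print(m_lo * (s @ s), "<=", s @ y, "<=", m_hi * (s @ s))
```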



Journal

Journal title: IMA Journal of Numerical Analysis

Year: 2020

ISSN: 0272-4979, 1464-3642

DOI: 10.1093/imanum/drz052